    Examining Race Differences in Blood Pressure Control among People with Chronic Kidney Disease

    Twenty percent of Chronic Kidney Disease (CKD) patients also have hypertension (HTN). African Americans (AA) are known to be at greater risk of developing CKD and of poor HTN control compared to Whites, largely due to their higher prevalence of diabetes and HTN. While those health conditions are known risk factors for CKD, it is less clear whether there is a race difference in HTN control among CKD patients. Using a combined 1999-2014 data set from the National Health and Nutrition Examination Survey (NHANES), we sought to determine whether there is an association between race and HTN control among CKD patients. A smaller proportion of AA CKD patients had controlled hypertension than White CKD patients (58.2% vs 71.6%; p<0.001). After adjusting for age, AA had lower odds of having their hypertension controlled (odds ratio (OR) = 0.58; 95% confidence interval (CI): 0.37-0.92) relative to Whites. When additionally adjusting for social factors and medical conditions, hypertensive African Americans with CKD had similar odds of having their hypertension controlled (OR = 0.55; 95% CI: 0.25-1.23) relative to their White peers. Social factors and medical conditions thus account for the race difference in hypertension control among CKD patients. Strategies to control hypertension among AA patients with CKD must include not only proper health care to treat and control medical conditions such as diabetes and stroke, but also efforts to address social factors. The results highlight the importance of creating interventions specifically focused on chronic disease prevention and management for African American adults, to attempt to delay the onset or impede the progression of CKD.
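
    The adjusted comparison above is the kind of estimate that comes out of a logistic regression, where the exponentiated race coefficient is the odds ratio. A minimal sketch in Python with statsmodels, assuming a hypothetical data frame of CKD patients; the column names (htn_controlled, race_aa, age) and the simulated data are illustrative, not the study's actual NHANES variables:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical CKD-patient data; column names are illustrative.
        # htn_controlled: 1 if hypertension is controlled, 0 otherwise
        # race_aa:        1 for African American, 0 for White
        # age:            age in years
        rng = np.random.default_rng(0)
        df = pd.DataFrame({
            "htn_controlled": rng.binomial(1, 0.65, 500),
            "race_aa": rng.binomial(1, 0.3, 500),
            "age": rng.uniform(30, 80, 500),
        })

        # Age-adjusted logistic regression of HTN control on race
        model = smf.logit("htn_controlled ~ race_aa + age", data=df).fit()

        # Exponentiated coefficient and CI give the adjusted odds ratio
        or_est = np.exp(model.params["race_aa"])
        ci_low, ci_high = np.exp(model.conf_int().loc["race_aa"])
        print(f"OR = {or_est:.2f}; 95% CI: {ci_low:.2f}-{ci_high:.2f}")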

    Measuring Global Similarity between Texts

    We propose a new similarity measure between texts which, contrary to current state-of-the-art approaches, takes a global view of the texts to be compared. We have implemented a tool to compute our textual distance and conducted experiments on several corpora of texts. The experiments show that our method can reliably identify different global types of texts.
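
    The abstract leaves the measure itself unspecified. As a rough illustration of what a "global" text distance can look like (a generic stand-in, not necessarily the authors' construction), the normalized compression distance compares whole documents at once rather than aligning local features:

        import zlib

        def ncd(x: str, y: str) -> float:
            """Normalized compression distance: a global, whole-document
            distance; smaller values mean more similar texts."""
            cx = len(zlib.compress(x.encode("utf-8")))
            cy = len(zlib.compress(y.encode("utf-8")))
            cxy = len(zlib.compress((x + y).encode("utf-8")))
            return (cxy - min(cx, cy)) / max(cx, cy)

        print(ncd("the cat sat on the mat", "a cat sat on a mat"))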

    Alien Registration- Savoy, Arthur J. (Orono, Penobscot County)

    https://digitalmaine.com/alien_docs/5955/thumbnail.jp

    Alien Registration- Savoy, George J. (Bangor, Penobscot County)

    https://digitalmaine.com/alien_docs/10824/thumbnail.jp

    PB1061 Soil Testing


    Chiral Extensions of the MSSM

    We present a class of extensions of the MSSM characterized by a fully chiral field content (no mu-terms) and, thanks to an extra U'(1) gauge symmetry, no baryon or lepton number violating terms in the superpotential. The minimal model consists of the usual matter sector with family-dependent U'(1) charges, six Higgs weak doublets, and three singlets required to give masses to the Higgsinos and cancel anomalies. We discuss its main features, such as the tree-level mass spectrum and the constraints on flavor-changing processes.
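
    Anomaly cancellation for an extra U'(1) reduces to sum rules over the chiral fermion content; the singlets enter precisely to help satisfy them. A generic sketch of the bookkeeping for two of the conditions (the charges below are placeholders chosen to cancel, not the paper's family-dependent assignment):

        # Each entry: (U'(1) charge q, multiplicity n) of a chiral fermion.
        # Placeholder charges, not the model's actual assignment.
        fermions = [(1.0, 3), (-1.0, 3), (2.0, 2), (-2.0, 2)]

        # Gravitational-U'(1) anomaly: the sum of q must vanish.
        # Cubic U'(1)^3 anomaly: the sum of q^3 must vanish.
        linear = sum(n * q for q, n in fermions)
        cubic = sum(n * q**3 for q, n in fermions)
        print(linear, cubic)  # both 0.0 for this assignment

    The full set of conditions also includes the mixed SU(3)^2-U'(1), SU(2)^2-U'(1), and U(1)_Y^2-U'(1) anomalies, each weighted by the appropriate group-theory factors.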

    Backscattered Electron (BSE) Imaging in the Scanning Electron Microscope (SEM) - Measurement of Surface Layer Mass-Thickness

    Sometimes, the sample to be examined in the SEM will consist of a compositionally non-uniform substrate that is covered by an approximately uniform surface layer. With a low enough incident beam energy, only the surface layer can be seen in the SEM image. The underlying structure can be seen in the secondary electron (SE) image if the range of the incident electrons is greater than twice the thickness of the surface film. In the backscattered electron (BSE) image the threshold energy is higher because the BSE detector is insensitive to slow electrons. The information depth in the BSE image was investigated experimentally as a function of incident energy and BSE detector position, using test specimens in which an Al layer of thickness either 210 or 1,100 nm was deposited onto an aluminised Si wafer covered by a pattern of gold lines. It was estimated that a lower limit to the surface mass-thickness that can be measured using a solid-state BSE detector is ~10 ÎŒg/cm2 (=40 nm of Al), as compared with ~0.25 ÎŒg/cm2 (=1 nm of Al) for the low-loss electron method. There would seem to be no reason why measurements by the BSE method could not be carried out automatically in a computer-controlled SEM equipped with image analysis and using the standard BSE detector systems, to measure the mass-thickness of a surface layer.
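
    The quoted equivalences follow directly from dividing mass-thickness by the bulk density of aluminium (about 2.70 g/cm3); a quick arithmetic check:

        # Convert a mass-thickness to a geometric thickness for aluminium.
        RHO_AL = 2.70  # g/cm^3, bulk density of Al

        def thickness_nm(mass_thickness_ug_cm2: float) -> float:
            """Thickness in nm for a mass-thickness given in ug/cm^2."""
            t_cm = (mass_thickness_ug_cm2 * 1e-6) / RHO_AL  # g/cm^2 -> cm
            return t_cm * 1e7  # cm -> nm

        print(thickness_nm(10))    # ~37 nm, i.e. the ~40 nm quoted for BSE
        print(thickness_nm(0.25))  # ~0.9 nm, i.e. the ~1 nm for low-loss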

    A Rationale for Long-lived Quarks and Leptons at the LHC: Low Energy Flavour Theory

    In the framework of gauged flavour symmetries, new fermions in parity-symmetric representations of the standard model are generically needed to compensate mixed anomalies. The key point is that their masses are also protected by flavour symmetries, and some of them are expected to lie well below the flavour symmetry breaking scale(s), which has to occur many orders of magnitude above the electroweak scale to be compatible with the available data from flavour-changing neutral currents and CP violation experiments. We argue that some of these fermions would plausibly get masses within the LHC range. If they are taken to be heavy quarks and leptons, in (bi-)fundamental representations of the standard model symmetries, their mixings with the light ones are strongly constrained to be very small by electroweak precision data. The alternative chosen here is to forbid such mixings exactly, by breaking the flavour symmetries down to an exact discrete symmetry, the so-called proton-hexality, originally suggested to avoid proton decay. As a consequence of the large value needed for the flavour-breaking scale, these heavy particles are long-lived and well suited to current and future LHC searches for quasi-stable hadrons and leptons. In fact, the LHC experiments have already started to look for them.
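
    The link between a high flavour-breaking scale and long lifetimes is simple dimensional analysis: a decay mediated by a dimension-six operator suppressed by the scale Lambda has a rate of order Gamma ~ m^5/Lambda^4. A back-of-the-envelope estimate (the numbers are illustrative, not taken from the paper):

        # Dimensional estimate of a decay through a dimension-six operator
        # suppressed by the flavour-breaking scale. Values are illustrative.
        HBAR = 6.582e-25  # GeV*s

        m_heavy = 1e3   # GeV, a TeV-scale heavy fermion
        Lam = 1e11      # GeV, flavour scale far above the electroweak scale

        gamma = m_heavy**5 / Lam**4  # decay rate in GeV
        tau = HBAR / gamma           # lifetime in seconds
        print(f"lifetime ~ {tau:.1e} s")  # ~7e+04 s: macroscopically long-lived

    Pushing the flavour scale higher makes the lifetime grow as the fourth power of Lambda, which is why such states land in the quasi-stable particle searches.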

    Investigating the need for clinicians to use tablet computers with a newly envisioned electronic health record

    Objective: The Veterans Health Administration (VHA) has deployed a large number of tablet computers in the last several years. However, little is known about how clinicians may use these devices with a newly planned Web-based electronic health record (EHR), as well as with other clinical tools. The objective of this study was to understand the types of use that can be expected of tablet computers versus desktops. Methods: Semi-structured interviews were conducted with 24 clinicians at a VHA Medical Center. Results: An inductive qualitative analysis resulted in findings organized around the recurrent themes of (1) Barriers, (2) Facilitators, (3) Current Use, (4) Anticipated Use, (5) Patient Interaction, and (6) Connection. Conclusions: Our study generated several recommendations for the use of tablet computers with the new health information technology tools being developed. Continuous connectivity for the mobile device is essential to avoid interruptions and clinician frustration. Clinicians also clearly wanted a physical keyboard available as an option for the tablet. Larger tablets (e.g., a regular-size iPad as compared to an iPad mini) were preferred. The ability to use secure messaging tools with the tablet computer was another consistent finding. Finally, more simplicity is needed for accessing patient data on mobile devices, while balancing the important need for adequate security.

    MIRACLE at Ad-Hoc CLEF 2005: Merging and Combining Without Using a Single Approach

    This paper presents the MIRACLE team's approach to the 2005 Ad-Hoc Information Retrieval tasks. The goal of this year's experiments was twofold: to continue testing the effect of combination approaches on information retrieval tasks, and to improve our basic processing and indexing tools, adapting them to new languages with unfamiliar encoding schemes. The starting point was a set of basic components: stemming, transforming, filtering, proper noun extraction, paragraph extraction, and pseudo-relevance feedback. Some of these basic components were used in different combinations and orders of application for document indexing and for query processing. Second-order combinations were also tested, by averaging or selectively combining the documents retrieved by different approaches for a particular query. In the multilingual track, we concentrated on merging the results of monolingual runs to obtain the overall multilingual result, relying on available translations. In both cross-lingual tracks we used available translation resources, and in some cases a combination approach.
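
    The abstract does not spell out the merging formula. A common baseline for this kind of run combination, shown here as a sketch rather than MIRACLE's exact method, is CombSUM over min-max-normalized scores:

        def normalize(run: dict[str, float]) -> dict[str, float]:
            """Min-max normalize one run's retrieval scores to [0, 1]."""
            lo, hi = min(run.values()), max(run.values())
            span = (hi - lo) or 1.0
            return {doc: (s - lo) / span for doc, s in run.items()}

        def combsum(runs: list[dict[str, float]]) -> list[tuple[str, float]]:
            """Merge runs by summing normalized scores per document."""
            merged: dict[str, float] = {}
            for run in runs:
                for doc, s in normalize(run).items():
                    merged[doc] = merged.get(doc, 0.0) + s
            return sorted(merged.items(), key=lambda kv: -kv[1])

        # Two hypothetical monolingual runs for the same query
        run_en = {"doc1": 12.0, "doc2": 7.5, "doc3": 3.1}
        run_es = {"doc2": 0.9, "doc4": 0.7}
        print(combsum([run_en, run_es]))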
    • 

    corecore